Sparsity-Aware Learning and Compressed Sensing: An Overview

Authors

  • Sergios Theodoridis
  • Yannis Kopsinis
  • Konstantinos Slavakis
Abstract

The notion of regularization has been widely used as a tool to address a number of problems that are usually encountered in Machine Learning. Improving the performance of an estimator by shrinking the norm of the MVU estimator, guarding against overfitting, coping with ill-conditioning, and providing a solution to an underdetermined set of equations are some notable examples where regularization has provided successful answers. A notable example is the ridge regression concept, where the LS loss function is combined, in a tradeoff rationale, with the Euclidean norm of the desired solution. In this paper, our interest will be on alternatives to the Euclidean norm, and in particular the focus will revolve around the ℓ1 norm; this is the sum of the absolute values of the components comprising a vector. Although seeking a solution to a problem via ℓ1-norm regularization of a loss function has been known and used since the 1970s, it is only recently that it has become...
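
As a quick illustration of the two regularization paths mentioned in the abstract, the ridge regression and ℓ1-regularized (lasso) criteria can be sketched as follows. The notation (y for the observations, X for the input matrix, θ for the unknown parameter vector, λ > 0 for the regularization weight) is generic and not taken from the paper itself:

    % ridge regression: LS loss traded off against the squared Euclidean norm
    \hat{\theta}_{\text{ridge}} = \arg\min_{\theta} \; \|y - X\theta\|_2^2 + \lambda \|\theta\|_2^2
    % l1-regularized variant: the Euclidean penalty is replaced by the sum of
    % absolute values of the components, which promotes sparse solutions
    \hat{\theta}_{\ell_1} = \arg\min_{\theta} \; \|y - X\theta\|_2^2 + \lambda \|\theta\|_1,
    \qquad \|\theta\|_1 = \sum_i |\theta_i|

Replacing the squared Euclidean penalty with the ℓ1 norm is what drives many components of the solution exactly to zero, i.e., it promotes sparsity.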


Similar papers

Robust regression in RKHS - An overview

The paper deals with the task of robust nonlinear regression in the presence of outliers. The problem is treated in the context of reproducing kernel Hilbert spaces (RKHS). In contrast to more classical approaches, a recent trend is to model the outliers as a sparse vector noise component and mobilize tools from the sparsity-aware/compressed sensing theory to impose sparsity on it. In this paper,...
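
A rough sketch of the modeling idea described above, in generic notation (y_i observations, f the unknown function in the RKHS H, u the sparse outlier vector, η_i dense noise; the symbols are illustrative and not taken from the paper itself): the outliers enter as an additive sparse component whose ℓ1 norm is penalized alongside the usual RKHS regularizer,

    % observations corrupted by dense noise plus a sparse outlier vector u
    y_i = f(x_i) + u_i + \eta_i, \qquad u \ \text{sparse},
    \min_{f \in H,\, u} \; \sum_i \big(y_i - f(x_i) - u_i\big)^2 + \lambda \|f\|_H^2 + \mu \|u\|_1 .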

Task-Aware Compressed Sensing with Generative Adversarial Networks

In recent years, neural network approaches have been widely adopted for machine learning tasks, with applications in computer vision. More recently, unsupervised generative models based on neural networks have been successfully applied to model data distributions via low-dimensional latent spaces. In this paper, we use Generative Adversarial Networks (GANs) to impose structure in compressed sen...

Bayesian compressed sensing with new sparsity-inducing prior

Sparse Bayesian learning (SBL) is a popular approach to sparse signal recovery in compressed sensing (CS). In SBL, the signal sparsity information is exploited by assuming a sparsity-inducing prior for the signal that is then estimated using Bayesian inference. In this paper, a new sparsity-inducing prior is introduced and efficient algorithms are developed for signal recovery. The main algorit...
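
For context only (this is the textbook connection, not necessarily the specific prior introduced in that paper): the classical example of a sparsity-inducing prior is the Laplace density, whose MAP estimate under a Gaussian likelihood coincides with ℓ1-regularized least squares,

    % Laplace prior on the signal; MAP estimation reduces to the lasso criterion
    p(\theta) \propto \exp\!\big(-\lambda \|\theta\|_1\big)
    \;\Rightarrow\; \hat{\theta}_{\text{MAP}} = \arg\min_{\theta} \; \tfrac{1}{2\sigma^2}\|y - X\theta\|_2^2 + \lambda \|\theta\|_1 .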

A Sharp Sufficient Condition for Sparsity Pattern Recovery

The sufficient number of noisy linear measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...

Sparsity Models for Hyperspectral Imaging and Compressive Sensing LIDAR

Virtually all types of imaging (including remote sensing) deal with high-dimensional signals that have low-dimensional structure that can be exploited. While classic models rely only on signals being bandlimited, more recent signal models are based on the notion that most signals can be written as a sum of just a few elements from a suitable dictionary (known as "sparsity"). Sparsity models hav...
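
The dictionary-based model alluded to above can be written compactly in generic notation (x the signal, D the dictionary, α the coefficient vector; the symbols are illustrative and not taken from the paper itself): the signal is synthesized from only a few dictionary columns,

    % sparse synthesis model: few nonzero coefficients select few dictionary atoms
    x = D\alpha, \qquad \|\alpha\|_0 = \#\{\, i : \alpha_i \neq 0 \,\} \ll \dim(\alpha).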


Journal title:
  • CoRR

Volume: abs/1211.5231   Issue: -

Pages: -

Publication date: 2012